
Search in the Catalogues and Directories

Hits 1 – 20 of 191

1. Delving Deeper into Cross-lingual Visual Question Answering ... (BASE)
2. Cross-Lingual Dialogue Dataset Creation via Outline-Based Generation ... (BASE)
3. Improving Word Translation via Two-Stage Contrastive Learning ... (BASE)
4. Specializing unsupervised pretraining models for word-level semantic similarity
   Lauscher, Anne [author]; Vulic, Ivan [author]; Ponti, Edoardo Maria [author]. Mannheim: Universitätsbibliothek Mannheim, 2021 (DNB Subject Category Language)
5. Towards Zero-shot Language Modeling ... (BASE)
6. Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems ... (BASE)
7. Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking ... (BASE)
8. Combining Deep Generative Models and Multi-lingual Pretraining for Semi-supervised Document Classification ...
   Zhu, Yi; Shareghi, Ehsan; Li, Yingzhen. arXiv, 2021 (BASE)
9. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ... (BASE)
10. Parameter space factorization for zero-shot learning across tasks and languages ... (BASE)
11. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ...
    Liu, Qianchu; Liu, Fangyu; Collier, Nigel. Apollo - University of Cambridge Repository, 2021 (BASE)
12. Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking ... (BASE)
13. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ... (BASE)
    Abstract: Recent work has indicated that pretrained language models (PLMs) such as BERT and RoBERTa can be transformed into effective sentence and word encoders even via simple self-supervised techniques. Inspired by this line of work, in this paper we propose a fully unsupervised approach to improving word-in-context (WiC) representations in PLMs, achieved via a simple and efficient WiC-targeted fine-tuning procedure: MirrorWiC. The proposed method leverages only raw texts sampled from Wikipedia, assuming no sense-annotated data, and learns context-aware word representations within a standard contrastive learning setup. We experiment with a series of standard and comprehensive WiC benchmarks across multiple languages. Our proposed fully unsupervised MirrorWiC models obtain substantial gains over off-the-shelf PLMs across all monolingual, multilingual and cross-lingual setups. Moreover, on some standard WiC benchmarks, MirrorWiC is even on par with supervised models fine-tuned with in-task data and sense labels. ...
    Keywords: Computational Linguistics; Language Models; Machine Learning; Machine Learning and Data Mining; Natural Language Processing
    URL: https://underline.io/lecture/39862-mirrorwic-on-eliciting-word-in-context-representations-from-pretrained-language-models
    DOI: https://dx.doi.org/10.48448/hs20-qq06
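    The abstract describes the method only at a high level: fine-tune a PLM on raw text, with no sense labels, so that two encodings of the same word in the same context pull together under a standard contrastive objective. The sketch below illustrates that general idea in PyTorch with Hugging Face transformers; the model name, span-selection helper, mean-pooling choice, and temperature are illustrative assumptions, not the authors' published MirrorWiC configuration.

    # Minimal sketch (assumed setup, not the exact MirrorWiC recipe):
    # two dropout-perturbed encodings of the same word-in-context form a
    # positive pair; other words in the batch are in-batch negatives (InfoNCE).
    import torch
    import torch.nn.functional as F
    from transformers import AutoModel, AutoTokenizer

    model_name = "bert-base-uncased"   # any BERT/RoBERTa-style PLM
    tok = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    model.train()                      # keep dropout active: it supplies the two "views"

    def encode_targets(sentences, char_spans):
        """Mean-pool the subword states covering each target word's char span."""
        enc = tok(sentences, return_tensors="pt", padding=True,
                  truncation=True, return_offsets_mapping=True)
        offsets = enc.pop("offset_mapping")    # (B, T, 2) char offsets per token
        hidden = model(**enc).last_hidden_state
        reps = []
        for i, (s, e) in enumerate(char_spans):
            mask = (offsets[i, :, 0] < e) & (offsets[i, :, 1] > s)
            reps.append(hidden[i][mask].mean(dim=0))
        return torch.stack(reps)               # (B, H): one vector per target word

    def info_nce(z1, z2, temperature=0.05):
        z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
        logits = z1 @ z2.t() / temperature     # (B, B) cosine similarities
        labels = torch.arange(z1.size(0))      # positives sit on the diagonal
        return F.cross_entropy(logits, labels)

    # Raw text only, no sense annotations; spans mark the target word "bank".
    sents = ["The bank raised interest rates.",
             "They sat on the bank of the river."]
    spans = [(4, 8), (16, 20)]
    z1 = encode_targets(sents, spans)          # first stochastic forward pass
    z2 = encode_targets(sents, spans)          # second pass, different dropout
    loss = info_nce(z1, z2)
    loss.backward()

    Because the encoder stays in train mode, dropout alone differentiates the two forward passes, the dropout-as-augmentation idea popularized by SimCSE-style contrastive encoders; no sense-annotated data enters the loss, matching the fully unsupervised setting the abstract claims.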
14. Semantic Data Set Construction from Human Clustering and Spatial Arrangement ...
    Majewska, Olga; McCarthy, Diana; Van Den Bosch, Jasper JF. Apollo - University of Cambridge Repository, 2021 (BASE)
15. Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders ...
    Liu, Fangyu; Vulić, I; Korhonen, Anna-Leena. Apollo - University of Cambridge Repository, 2021 (BASE)
16. Context vs Target Word: Quantifying Biases in Lexical Semantic Datasets ... (BASE)
17. AM2iCo: Evaluating Word Meaning in Context across Low-Resource Languages with Adversarial Examples ... (BASE)
18. Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders ... (BASE)
19. Parameter space factorization for zero-shot learning across tasks and languages
    In: Transactions of the Association for Computational Linguistics, 9 (2021) (BASE)
20. AM2iCo: Evaluating Word Meaning in Context across Low-Resource Languages with Adversarial Examples ... (BASE)


Hits by source (facet counts): Catalogues 8, Bibliographies 4, Linked Open Data catalogues 0, Online resources 0, Open access documents 180.